Accelerated Backpropagation Learning: Two Optimization Methods
Abstract
Two methods for increasing the performance of the backpropagation learning algorithm are presented, and their results are compared with those obtained by optimizing the parameters of the standard method. The first method adapts a scalar learning rate so that the energy value decreases along the gradient direction in a close-to-optimal way. The second is derived from the conjugate gradient method with inexact line searches. The strict locality requirement is relaxed, but parallelism of computation is maintained, allowing efficient use of concurrent computation. For medium-size problems, typical speedups of one order of magnitude are obtained.
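The two approaches named in the abstract can be sketched on a toy quadratic "energy". This is a hedged illustration, not the paper's exact update rules: the grow/shrink factors for the adaptive rate, the backtracking line search, and the Polak-Ribiere formula are all assumptions standing in for details the abstract does not give.

```python
import numpy as np

def adaptive_gradient_descent(f, grad, w, lr=0.1, up=1.1, down=0.5, steps=200):
    """Sketch of method 1: a scalar learning rate is adapted so that each
    step decreases the energy along the gradient direction (the up/down
    factors here are illustrative assumptions)."""
    for _ in range(steps):
        g = grad(w)
        # Shrink the rate until the step actually decreases the energy.
        while lr > 1e-12 and f(w - lr * g) >= f(w):
            lr *= down
        w = w - lr * g
        lr *= up  # cautiously grow the rate after a successful step
    return w

def conjugate_gradient(f, grad, w, steps=200):
    """Sketch of method 2: conjugate gradient (Polak-Ribiere+ variant,
    an assumption) with an inexact backtracking line search."""
    g = grad(w)
    d = -g
    for _ in range(steps):
        if g @ g < 1e-20:          # already at a stationary point
            break
        t = 1.0                    # inexact line search: backtrack from 1
        while t > 1e-12 and f(w + t * d) >= f(w):
            t *= 0.5
        w = w + t * d
        g_new = grad(w)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ restart rule
        d = -g_new + beta * d
        g = g_new
    return w

# Toy quadratic energy E(w) = 0.5 * w^T A w with minimum at the origin.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda w: 0.5 * w @ A @ w
grad = lambda w: A @ w

w0 = np.array([1.0, -2.0])
w_adapt = adaptive_gradient_descent(f, grad, w0.copy())
w_cg = conjugate_gradient(f, grad, w0.copy())
print(f(w_adapt), f(w_cg))  # both energies driven close to zero
```

Both routines only require the energy and its gradient, which is why the paper's relaxation of strict locality still leaves the per-weight computations parallelizable.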
Similar resources
Accelerated Backpropagation Learning : Parallel Tangent Optimization Algorithm
A modified backpropagation learning algorithm for training artificial neural networks using a deflecting-gradient technique, which may be considered a special case of the conjugate gradient methods, is proposed. The parallel tangent (Partan) gradient is used as an alternative to the momentum term to accelerate convergence. The Partan gradient consists of two phases, namely climbing through the gradient and a...
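The two-phase Partan scheme named in this snippet can be sketched as follows; the snippet is truncated, so the acceleration phase shown here (a line search through the point two iterations back) is the textbook Partan idea, an assumption rather than that paper's exact algorithm.

```python
import numpy as np

def partan(f, grad, w, lr=0.05, steps=100):
    """Parallel-tangent (Partan) descent sketch: alternate an ordinary
    gradient ("climbing") step with an acceleration step along the line
    from the point two iterations back, accepted only if it lowers the
    loss (the backtracking acceptance rule is an assumption)."""
    prev = w.copy()
    for _ in range(steps):
        y = w - lr * grad(w)       # phase 1: climb along the gradient
        d = y - prev               # phase 2: accelerate along (y - prev)
        t = 1.0
        while t > 1e-12 and f(y + t * d) >= f(y):
            t *= 0.5
        step = y + t * d
        prev, w = w, step if f(step) < f(y) else y
    return w

# Toy quadratic loss with minimum at the origin.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
f = lambda w: 0.5 * w @ A @ w
grad = lambda w: A @ w
w_pt = partan(f, grad, np.array([1.0, -2.0]))
print(f(w_pt))  # loss driven close to zero
```

The acceleration phase plays the role the momentum term plays in standard backpropagation, which is the substitution the snippet describes.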
Accelerated Backpropagation Learning: Extended Dynamic Parallel Tangent Optimization Algorithm
Deep Online Convex Optimization by Putting Forecaster to Sleep
Methods from convex optimization such as accelerated gradient descent are widely used as building blocks for deep learning algorithms. However, the reasons for their empirical success are unclear, since neural networks are not convex and standard guarantees do not apply. This paper develops the first rigorous link between online convex optimization and error backpropagation on convolutional net...
Revisit Long Short-Term Memory: An Optimization Perspective
Long Short-Term Memory (LSTM) is a deep recurrent neural network architecture with high computational complexity. Contrary to the standard practice of training LSTM online with stochastic gradient descent (SGD) methods, we propose a matrix-based batch learning method for LSTM with full Backpropagation Through Time (BPTT). We further solve the state drifting issues as well as improving the overall ...
Relational Databases Query Optimization using Hybrid Evolutionary Algorithm
Optimizing database queries is a hard research problem. Exhaustive search techniques such as dynamic programming are suitable for queries with few relations, but as the number of relations in a query grows, memory and processing demands increase and these methods become unsuitable, so random and evolutionary methods must be used instead. The use of evolutionary methods, beca...
Journal: Complex Systems
Volume 3, Issue -
Pages: -
Publication year: 1989